Amid concerns about online privacy, people willing to share more data
Cambridge, Mass.
Cybercriminals recently stole a trove of data from Verizon’s enterprise computing arm and put it up for sale online, a brazen act that comes amid a swirling debate about online privacy.
The data, which included “basic contact information” for about 1.5 million customers of Verizon Enterprise Solutions, the unit that provides IT services to major companies, was later put up for sale for $100,000 – or $10,000 per batch of 100,000 records, a rate at which the full 1.5 million records would cost $150,000.
Verizon said the breach, which was first reported by security journalist Brian Krebs, didn’t include any proprietary information about its customers, who were being notified about the hacking.
But ordinary consumers’ online data can also be mined in increasingly sophisticated ways and used in harmful marketing schemes, including ones that encourage people to take on debt they can’t afford.
Increasingly, researchers say, the wide range of information that sites – from social media to online shopping – collect about their users has given rise to a "privacy paradox": People say they are concerned about privacy but often display a willingness to share a wealth of information online.
"Expanding innovation while making sure that people don’t get hurt or unfairly disadvantaged along the way, these are not mutually exclusive," said Massachusetts Attorney General Maura Healey on Thursday, during a panel discussion on data privacy at the Massachusetts Institute of Technology in Cambridge, Mass.
One example is the unauthorized sale of data to "lead generators," companies that can use information they obtain about potential customers – such as their race or neighborhood ZIP code – to market predatory home lending or education debt schemes.
Cracking down on data brokers that sell customers' information without their permission for use in unsavory marketing efforts is an increasing challenge for law enforcement, Ms. Healey said.
"It's important to us that we take steps to guard against potential abuse … to make sure that data-driven innovations are a benefit for everyone, including the most vulnerable," she told an audience of academics, lawyers, and representatives of tech companies such as Facebook and Google.
For example, even doing a Google search for a name can sometimes yield ads with a racial bias and false information, said Latanya Sweeney, a professor of government and technology at Harvard University.
In 2013, she discovered that Google searches for names identified as black were 25 percent more likely to yield ads that promised to unveil a person’s arrest record than names identified as white.
Ms. Sweeney, formerly the chief technologist at the Federal Trade Commission, found the trend inadvertently.
While searching online for a paper she had written that she wanted to show a colleague, she discovered an ad that promised, "Latanya Sweeney, get her arrest record."
But she had never been arrested, and she later paid the background check company Instant Checkmate to confirm that no arrest record existed for her name.
By testing more than 2,100 names of real people in searches on Google and on other sites that carry ads delivered by Google, she found that names that sounded black were 25 percent more likely to produce an ad for a criminal record than names that sounded white – even when the person had no arrest record. That led her to conclude that Google’s AdWords service could be creating the discriminatory results.
"Initially, I was like, 'Hey, I’m a computer scientist, computers don’t do that,' " she told the audience.
Google, which provided partial funding for Sweeney’s research, denied that its service was responsible, noting that it barred ads that advocate against a person or group and that "it is up to individual advertisers to decide which keywords they want to choose to trigger their ads."
It wasn't clear exactly why the results appeared, Sweeney said. But, she added, "These kinds of design decisions have profound effects on our daily lives."
Some panelists said that with little movement from Congress on a proposed Consumer Privacy Bill of Rights, states may be better placed to take up the mantle of protecting consumers from data abuses through targeted measures that address specific issues, such as how to handle student records.
Representatives from tech companies argued that having a "patchwork" of state regulations could lead to confusion.
"You end up, as a lawyer, trying to comply with 50 different individual privacy laws," said John Doherty, vice president of state policy and politics at TechNet, an industry trade group.
But as more schools have embraced the use of technology, issues of student privacy have also become a national concern.
On Wednesday, one parents' group pressed Congress to modernize the student privacy law known as FERPA (the Family Educational Rights and Privacy Act), urging lawmakers to give parents the right to consent before student information is included in the statewide databases often used for education research.
"Parents continue to seek answers to exactly what information pertaining to their children is being collected, who has access to the information and for what purpose, and when that information will be destroyed," Rachael Stickland, a Colorado parent who co-founded the Parents Coalition for Student Privacy, told members of the House Education Committee.
But education researchers said having access to statewide data – which is stripped of personal information before they receive it – is highly valuable for measuring how particular programs work.
Having an opt-out policy "means that when we get the data we wouldn’t be able to see if the program we're studying was beneficial, the data could be biased," said Jane Hannaway, a professor at Georgetown University’s McCourt School of Public Policy.
"It could be that middle class parents of adolescent boys didn’t want their children to opt-in to the study, and we couldn’t see how those programs benefit them," she added.
Last year, the Electronic Frontier Foundation filed a complaint with the Federal Trade Commission alleging that Google was tracking student data without permission through its cloud-based education apps. The company countered that it hadn't gone back on a promise not to track students' information.
At the MIT event, consumer advocates said it was increasingly difficult for people to determine how their data could be used online, especially by obscure companies that lack the name recognition of many tech giants.
"It’s great that some companies, like Google and Facebook, have very public privacy information, but there are layers and layers of companies who are buying and selling this information whose names you’ve never heard of using algorithms in ways you could never even imagine," said Persis Yu, a staff attorney at the National Consumer Law Center.
"There’s an imbalance of – they know everything about the consumer, and the consumer doesn’t know probably anything about them," she said, "and it makes it very hard – from a consumer’s perspective – to try to protect themselves."